995 results for Acoustic stimulation


Relevance:

100.00%

Publisher:

Abstract:

Pt. 1. Deafening effects of noise on the cat -- Pt. 2. Histological effects of intense sound on the inner ear.

Relevance:

70.00%

Publisher:

Abstract:

The literature has investigated the effects of chronic auditory stimulation with baroque music on the cardiovascular system; however, it lacks data on the acute effects of different styles of music on cardiac autonomic regulation. This study evaluated the acute effects of baroque and heavy metal music on heart rate variability (HRV) in women. The study was performed in 21 healthy women between 18 and 30 years old. We excluded persons with previous experience with musical instruments and those who had an affinity with the song styles. All procedures were performed in the same sound-proof room. We analyzed HRV in the time domain (standard deviation of normal-to-normal RR intervals, root-mean square of differences between adjacent normal RR intervals, and the percentage of adjacent RR intervals differing in duration by more than 50 ms) and the frequency domain (low frequency [LF], high frequency [HF], and the LF/HF ratio). HRV was recorded at rest for 10 min. Subsequently, participants were exposed to baroque or heavy metal music for 5 min through an earphone. After the first music exposure they remained at rest for a further 5 min and were then exposed again to baroque or heavy metal music. The sequence of songs was randomized for each individual. A power analysis indicated a minimum of 18 subjects. The Shapiro-Wilk test was used to verify the normality of the data; repeated-measures analysis of variance followed by the Bonferroni test was applied to parametric variables, and Friedman's test followed by Dunn's post-test to non-parametric distributions. The time-domain indices were not changed. In the frequency-domain analysis, LF in absolute units was reduced during heavy metal music stimulation compared to control. Acute exposure to heavy metal music affected sympathetic activity in healthy women.
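
For reference, the time-domain HRV indices named above (SDNN, RMSSD, and pNN50) can be computed directly from a series of normal-to-normal RR intervals. The sketch below is only a generic illustration of those definitions, not the analysis pipeline used in the study; the sample RR series is invented.

import numpy as np

def time_domain_hrv(rr_ms):
    """Compute SDNN, RMSSD, and pNN50 from RR intervals given in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)                            # differences between adjacent RR intervals
    sdnn = rr.std(ddof=1)                          # standard deviation of normal-to-normal intervals
    rmssd = np.sqrt(np.mean(diffs ** 2))           # root-mean-square of successive differences
    pnn50 = 100.0 * np.mean(np.abs(diffs) > 50.0)  # % of adjacent intervals differing by > 50 ms
    return sdnn, rmssd, pnn50

# Invented RR series (ms), roughly 75 beats per minute with modest variability
print(time_domain_hrv([800, 810, 790, 805, 820, 795, 815, 800]))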

Relevance:

70.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

70.00%

Publisher:

Abstract:

This dissertation studies the manipulation of particles using acoustic stimulation for applications in microfluidics and the templating of devices. The term particle is used here to denote any solid, liquid, or gaseous material with properties distinct from the fluid in which it is suspended. Manipulation means taking control of the particles' movements and positioning them at specified locations. Using devices microfabricated from silicon, the behavior of particles under acoustic stimulation was studied, with the main purpose of aligning the particles at either low-pressure zones, known as nodes, or high-pressure zones, known as anti-nodes. By aligning particles at the nodes in a flow system, the particles can be focused at the center or the walls of a microchannel in order to ultimately separate them. These separations are of high scientific importance, especially in the biomedical domain, since acoustophoresis provides a unique approach to separating particles based on density and compressibility, unparalleled by other techniques. Control and alignment of the particles in various geometries and configurations was successfully achieved by controlling the acoustic waves. Apart from their use in flow systems, a stationary suspended-particle device was developed to provide controllable light transmittance based on acoustic stimuli. Using a glass compartment and a carbon-particle suspension in an organic solvent, the device responded to acoustic stimulation by aligning the particles. The alignment of the light-absorbing carbon particles afforded an increase in visible light transmittance as high as 84.5%, controlled by adjusting the frequency and amplitude of the acoustic wave. The device also demonstrated alignment memory, rendering it energy-efficient. A similar device for particles suspended in a monomer enabled the development of electrically conductive films based on networks of conductive particles. Elastomers doped with conductive metal particles were rendered surface-conductive at particle loadings as low as 1% by weight using acoustic focusing. The resulting films were flexible, had transparencies exceeding 80% in the visible spectrum (400-800 nm), and had electrical bulk conductivities exceeding 50 S/cm.
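
The node alignment described above relies on setting up a standing acoustic wave across the microchannel. A common design rule in acoustofluidics, given here only as an illustrative calculation and not as the specific design used in the dissertation, is the half-wavelength resonance condition f = c / (2w), which places a single pressure node along the channel centerline. The channel width and speed of sound below are assumed values.

def half_wave_resonance_hz(channel_width_m, sound_speed_m_s=1480.0):
    """Actuation frequency placing one pressure node at the center of a channel of width w."""
    return sound_speed_m_s / (2.0 * channel_width_m)

# Example: a hypothetical 375-micrometer-wide water-filled channel in silicon
print(half_wave_resonance_hz(375e-6))  # roughly 2 MHz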

Relevance:

60.00%

Publisher:

Abstract:

Crickets have two tympanal membranes on the tibiae of each foreleg. Among several field cricket species of the genus Gryllus (Gryllinae), the posterior tympanal membrane (PTM) is significantly larger than the anterior membrane (ATM). Laser Doppler vibrometric measurements have shown that the smaller ATM does not respond as much as the PTM to sound. Hence the PTM has been suggested to be the principal tympanal acoustic input to the auditory organ. In tree crickets (Oecanthinae), the ATM is slightly larger than the PTM. Both membranes are structurally complex, presenting a series of transverse folds on their surface, which are more pronounced on the ATM than on the PTM. The mechanical response of both membranes to acoustic stimulation was investigated using microscanning laser Doppler vibrometry. Only a small portion of the membrane surface deflects in response to sound. Both membranes exhibit similar frequency responses, and move out of phase with each other, producing compressions and rarefactions of the tracheal volume backing the tympanum. Therefore, unlike field crickets, tree crickets may have four instead of two functional tympanal membranes. This is interesting in the context of the outstanding question of the role of spiracular inputs in the auditory system of tree crickets.

Relevance:

60.00%

Publisher:

Abstract:

While cochlear implants (CIs) usually provide high levels of speech recognition in quiet, speech recognition in noise remains challenging. To overcome these difficulties, it is important to understand how implanted listeners separate a target signal from interferers. Stream segregation has been studied extensively in both normal and electric hearing, as a function of place of stimulation. However, the effects of pulse rate, independent of place, on the perceptual grouping of sequential sounds in electric hearing have not yet been investigated. A rhythm detection task was used to measure stream segregation. The results of this study suggest that while CI listeners can segregate streams based on differences in pulse rate alone, the amount of stream segregation observed decreases as the base pulse rate increases. Further investigation of the perceptual dimensions encoded by the pulse rate and the effect of sequential presentation of different stimulation rates on perception could be beneficial for the future development of speech processing strategies for CIs.

Relevance:

60.00%

Publisher:

Abstract:

The affective impact of music arises from a variety of factors, including intensity, tempo, rhythm, and tonal relationships. The emotional coloring evoked by intensity, tempo, and rhythm appears to arise from association with the characteristics of human behavior in the corresponding condition; however, how and why particular tonal relationships in music convey distinct emotional effects is not clear. The hypothesis examined here is that major and minor tone collections elicit different affective reactions because their spectra are similar to the spectra of voiced speech uttered in different emotional states. To evaluate this possibility, the spectra of the intervals that distinguish major and minor music were compared to the spectra of voiced segments in excited and subdued speech, using fundamental frequency and frequency ratios as measures. Consistent with the hypothesis, the spectra of major intervals are more similar to spectra found in excited speech, whereas the spectra of particular minor intervals are more similar to the spectra of subdued speech. These results suggest that the characteristic affective impact of major and minor tone collections arises from associations routinely made between particular musical intervals and voiced speech.
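
The frequency-ratio measure mentioned above can be made concrete with the just-intonation ratios conventionally assigned to the intervals that distinguish major from minor tone collections. The values below are standard textbook ratios used purely as a worked example; they are not taken from the study's stimuli.

import math

def cents(ratio):
    """Size of an interval in cents for a given frequency ratio."""
    return 1200.0 * math.log2(ratio)

major_third = 5 / 4  # e.g., C to E
minor_third = 6 / 5  # e.g., C to E-flat
print(cents(major_third))  # about 386 cents
print(cents(minor_third))  # about 316 cents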

Relevance:

60.00%

Publisher:

Abstract:

Infants' speech perception abilities change through the first year of life, from broad sensitivity to a wide range of speech contrasts to becoming more finely attuned to their native language. What remains unclear, however, is how this perceptual change relates to brain responses to native language contrasts in terms of the functional specialization of the left and right hemispheres. Here, to elucidate the developmental changes in functional lateralization accompanying this perceptual change, we conducted two experiments on Japanese infants using Japanese lexical pitch-accent, which changes word meanings with the pitch pattern within words. In the first behavioral experiment, using visual habituation, we confirmed that infants at both 4 and 10 months have sensitivities to the lexical pitch-accent pattern change embedded in disyllabic words. In the second experiment, near-infrared spectroscopy was used to measure cortical hemodynamic responses in the left and right hemispheres to the same lexical pitch-accent pattern changes and their pure tone counterparts. We found that brain responses to the pitch change within words differed between 4- and 10-month-old infants in terms of functional lateralization: Left hemisphere dominance for the perception of the pitch change embedded in words was seen only in the 10-month-olds. These results suggest that the perceptual change in Japanese lexical pitch-accent may be related to a shift in functional lateralization from bilateral to left hemisphere dominance.

Relevance:

60.00%

Publisher:

Abstract:

Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space, maps and such rate codes involve different response patterns at the level of individual neurons. Maps consist of neurons exhibiting circumscribed receptive fields, whereas rate codes involve open-ended response patterns that peak in the periphery. This coding format discrepancy therefore poses a potential problem for brain regions responsible for representing both visual and auditory information. Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate or level-of-activity code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which the site and level of SC activity both contribute to the computation of stimulus location is successful at evaluating the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.
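
The abstract does not spell out the read-out algorithm, so the toy sketch below only illustrates the general idea that both which neurons respond (site) and how strongly they respond (level) can jointly determine a decoded location. The preferred azimuths, firing rates, and the centroid weighting scheme are all assumptions made for illustration.

import numpy as np

# Toy population: each unit has a preferred azimuth (site); its firing rate (level)
# scales with the stimulus. The decoded location is the rate-weighted centroid,
# so site and level of activity both contribute to the estimate.
preferred_azimuth_deg = np.linspace(-40, 40, 9)

def decode_azimuth(firing_rates_hz):
    """Rate-weighted centroid of preferred azimuths (illustrative read-out only)."""
    rates = np.asarray(firing_rates_hz, dtype=float)
    return np.sum(rates * preferred_azimuth_deg) / np.sum(rates)

# Hypothetical population response, with stronger firing in units preferring rightward locations
print(decode_azimuth([2, 3, 5, 8, 12, 20, 30, 24, 15]))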

Relevance:

60.00%

Publisher:

Abstract:

Older adults recall less episodically rich autobiographical memories (AMs); however, the neural basis of this effect is not clear. Using functional MRI, we examined the effects of age during the search and elaboration phases of AM retrieval. Our results suggest that the age-related attenuation in the episodic richness of AMs is associated with difficulty in the strategic retrieval processes underlying the recovery of information during elaboration. First, age effects on AM activity were more pronounced during elaboration than search, with older adults showing less sustained recruitment of the hippocampus and ventrolateral prefrontal cortex (VLPFC) for less episodically rich AMs. Second, there was an age-related reduction in the modulation of top-down coupling of the VLPFC on the hippocampus for episodically rich AMs. In sum, the present study shows that changes in the sustained response and coupling of the hippocampus and prefrontal cortex (PFC) underlie age-related reductions in the episodic richness of the personal past.

Relevance:

60.00%

Publisher:

Abstract:

Remembering past events - or episodic retrieval - consists of several components. There is evidence that mental imagery plays an important role in retrieval and that the brain regions supporting imagery overlap with those supporting retrieval. An open issue is to what extent these regions support successful vs. unsuccessful imagery and retrieval processes. Previous studies that examined regional overlap between imagery and retrieval used uncontrolled memory conditions, such as autobiographical memory tasks, that cannot distinguish between successful and unsuccessful retrieval. A second issue is that fMRI studies that compared imagery and retrieval have used modality-aspecific cues that are likely to activate auditory and visual processing regions simultaneously. Thus, it is not clear to what extent identified brain regions support modality-specific or modality-independent imagery and retrieval processes. In the current fMRI study, we addressed this issue by comparing imagery to retrieval under controlled memory conditions in both auditory and visual modalities. We also obtained subjective measures of imagery quality allowing us to dissociate regions contributing to successful vs. unsuccessful imagery. Results indicated that auditory and visual regions contribute both to imagery and retrieval in a modality-specific fashion. In addition, we identified four sets of brain regions with distinct patterns of activity that contributed to imagery and retrieval in a modality-independent fashion. The first set of regions, including hippocampus, posterior cingulate cortex, medial prefrontal cortex and angular gyrus, showed a pattern common to imagery/retrieval and consistent with successful performance regardless of task. The second set of regions, including dorsal precuneus, anterior cingulate and dorsolateral prefrontal cortex, also showed a pattern common to imagery and retrieval, but consistent with unsuccessful performance during both tasks. Third, left ventrolateral prefrontal cortex showed an interaction between task and performance and was associated with successful imagery but unsuccessful retrieval. Finally, the fourth set of regions, including ventral precuneus, midcingulate cortex and supramarginal gyrus, showed the opposite interaction, supporting unsuccessful imagery, but successful retrieval performance. Results are discussed in relation to reconstructive, attentional, semantic memory, and working memory processes. This is the first study to separate the neural correlates of successful and unsuccessful performance for both imagery and retrieval and for both auditory and visual modalities.

Relevance:

60.00%

Publisher:

Abstract:

In a recent study, we reported that the accurate perception of beat structure in music ('perception of musical meter') accounted for over 40% of the variance in single word reading in children with and without dyslexia (Huss et al., 2011). Performance in the musical task was most strongly associated with the auditory processing of rise time, even though beat structure was varied by manipulating the duration of the musical notes.

Relevance:

60.00%

Publisher:

Abstract:

Multisensory stimuli can improve performance, facilitating reaction times (RTs) on sensorimotor tasks. This benefit is referred to as the redundant signals effect (RSE) and can exceed predictions based on probability summation, which is indicative of integrative processes. Although an RSE exceeding probability summation has been repeatedly observed in humans and nonprimate animals, there are scant and inconsistent data from nonhuman primates performing similar protocols; existing paradigms have instead focused on saccadic eye movements. Moreover, the extant results in monkeys leave unresolved how stimulus synchronicity and intensity impact performance. Two trained monkeys performed a simple detection task involving arm movements to auditory, visual, or synchronous auditory-visual multisensory pairs. RSEs in excess of predictions based on probability summation were observed and thus necessarily follow from neural response interactions. Parametric variation of auditory stimulus intensity revealed that, in both animals, RT facilitation was limited to situations where the auditory stimulus intensity was below or up to 20 dB above perceptual threshold, despite the visual stimulus always being suprathreshold. No RT facilitation, and even behavioral costs, were obtained with auditory intensities 30-40 dB above threshold. The present study demonstrates the feasibility and suitability of behaving monkeys for investigating links between psychophysical and neurophysiologic instantiations of multisensory interactions.
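
The comparison against probability summation referred to above is commonly carried out with the race-model inequality, which bounds the cumulative RT distribution for the multisensory condition by the sum of the unisensory distributions: P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t). The sketch below is a generic illustration of that test, not the analysis reported in the study; the RT samples are invented.

import numpy as np

def ecdf(samples, t):
    """Fraction of RTs at or below time t (ms)."""
    return np.mean(np.asarray(samples) <= t)

def race_model_violation(rt_av, rt_a, rt_v, t_grid):
    """Positive values mean the multisensory CDF exceeds the probability-summation bound."""
    bound = np.array([min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t)) for t in t_grid])
    observed = np.array([ecdf(rt_av, t) for t in t_grid])
    return observed - bound

rng = np.random.default_rng(0)  # invented RT samples (ms), for illustration only
rt_a = rng.normal(320, 40, 200)
rt_v = rng.normal(300, 40, 200)
rt_av = rng.normal(260, 35, 200)
print(race_model_violation(rt_av, rt_a, rt_v, np.arange(200, 401, 25)).round(3))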